2‐complexes with unique embeddings in 3‐space
Authors
Abstract
A well-known theorem of Whitney states that a 3-connected planar graph admits an essentially unique embedding into the 2-sphere. We prove a 3-dimensional analogue: a simply-connected 2-complex all of whose link graphs are 3-connected admits an essentially unique locally flat embedding into the 3-sphere, if it admits one at all. This can be thought of as a generalisation of the 3-dimensional Schoenflies theorem.
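For readers who prefer the two statements displayed side by side, the following is a minimal LaTeX sketch paraphrasing the theorems from the abstract; the theorem environment and the exact phrasing of the hypotheses (in particular the 3-connectedness of the link graphs) follow the reconstruction above, not the paper's own wording or numbering.

% Minimal LaTeX sketch: the two statements paraphrased from the abstract.
% Phrasing and hypotheses are a reconstruction, not the paper's exact text.
\documentclass{article}
\usepackage{amsthm,amssymb}
\newtheorem{theorem}{Theorem}

\begin{document}

\begin{theorem}[Whitney]
Every $3$-connected planar graph admits an essentially unique embedding into the
$2$-sphere $S^2$; that is, any two such embeddings differ by a homeomorphism of $S^2$.
\end{theorem}

\begin{theorem}[$3$-dimensional analogue, as stated in the abstract]
Let $C$ be a simply-connected $2$-complex all of whose link graphs are $3$-connected.
If $C$ admits a locally flat embedding into the $3$-sphere $S^3$, then that embedding
is essentially unique.
\end{theorem}

\end{document}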
Similar resources
QFD planning with cost consideration in fuzzy environment
In the present era, in which competition among organizations has expanded greatly, studying and designing production and service systems in order to optimize their performance is unavoidable. A major part of organizations' competitiveness is the result of their customers' satisfaction. The success of today's organizations depends on their efforts to identify customers' wants and needs and to satisfy those needs. On the other hand, shortening the time of delivering products/services to customers...
First 15 pages
Search Personalization with Embeddings
Recent research has shown that the performance of search personalization depends on the richness of user profiles which normally represent the user’s topical interests. In this paper, we propose a new embedding approach to learning user profiles, where users are embedded on a topical interest space. We then directly utilize the user profiles for search personalization. Experiments on query logs...
Full text
Streaming Embeddings with Slack
We study the problem of computing low-distortion embeddings in the streaming model. We present streaming algorithms that, given an n-point metric space M , compute an embedding of M into an n-point metric space M ′ that preserves a (1−σ)-fraction of the distances with small distortion (σ is called the slack). Our algorithms use space polylogarithmic in n and the spread of the metric. Within suc...
Full text
Learning with Memory Embeddings
Embedding learning, a.k.a. representation learning, has been shown to be able to model large-scale semantic knowledge graphs. A key concept is a mapping of the knowledge graph to a tensor representation whose entries are predicted by models using latent representations of generalized entities. Latent variable models are well suited to deal with the high dimensionality and sparsity of typical kn...
Full text
Parsing with Context Embeddings
We introduce context embeddings, dense vectors derived from a language model that represent the left/right context of a word instance, and demonstrate that context embeddings significantly improve the accuracy of our transition based parser. Our model consists of a bidirectional LSTM (BiLSTM) based language model that is pre-trained to predict words in plain text, and a multi-layer perceptron (...
Full text
Journal
Journal title: Bulletin of the London Mathematical Society
Year: 2022
ISSN: 0024-6093, 1469-2120
DOI: https://doi.org/10.1112/blms.12718